Add native inference with encrypted secrets #381
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: c013a1ae51
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
```js
if (customModelOverride) {
  model = customModelOverride;
  if (bodyData && typeof bodyData === "object" && "model" in bodyData) {
    bodyData = { ...bodyData, model: customModelOverride };
```
Apply custom system prompt even without model override
The new custom_system_prompt metadata is only applied inside the if (customModelOverride) block. That means configuring a system prompt alone (without also setting custom_model) is silently ignored, even though the schema adds custom_system_prompt as a standalone option. If a user supplies only custom_system_prompt, requests will proceed without the intended prompt injection. Consider applying the prompt logic whenever custom_system_prompt is set, regardless of custom_model.
Useful? React with 👍 / 👎.
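A minimal sketch of the restructuring the comment suggests, assuming the overrides are read into customModelOverride and customSystemPromptOverride variables and that the request body carries an OpenAI-style messages array (the actual metadata names, body shape, and injection point in this repo may differ):

```ts
// Sketch only: decouple the two overrides so each applies independently.
// `customModelOverride` and `bodyData` exist in the surrounding code;
// `customSystemPromptOverride` and the messages shape are assumptions.
type ChatMessage = { role: string; content: string };
type ChatBody = { model?: string; messages?: ChatMessage[]; [key: string]: unknown };

function applyOverrides(
  bodyData: unknown,
  customModelOverride?: string,
  customSystemPromptOverride?: string,
): unknown {
  if (!bodyData || typeof bodyData !== "object") return bodyData;
  let body = { ...(bodyData as ChatBody) };

  // Model override: same behavior as the existing block.
  if (customModelOverride && "model" in body) {
    body = { ...body, model: customModelOverride };
  }

  // System prompt override: applied even when no model override is set.
  if (customSystemPromptOverride && Array.isArray(body.messages)) {
    const rest = body.messages.filter((m) => m.role !== "system");
    body = {
      ...body,
      messages: [{ role: "system", content: customSystemPromptOverride }, ...rest],
    };
  }

  return body;
}
```

Splitting the two checks keeps the existing model-override behavior unchanged while letting a standalone custom_system_prompt take effect.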
No description provided.